[Music]

Greetings to our friends of astrobiology, and welcome to Ask an Astrobiologist, the show that celebrates the science and the scientists involved in the realm of astrobiology. Today we're going a bit beyond that, even into the philosophy of astrobiology. But before we get to today's guest and have an awesome conversation in that realm, we have some of our fun little things that we do every month.

For starters, we have our background quiz. As our longtime viewers know, we have a picture behind our host every single month, like this wonderful picture behind me, and it's up to you in the next month to guess what that image is. So right now our producer and director, Mike Toillion, is going to put last month's picture up onto the screen, and your job was to guess what this image is showing. That's actually Fortescue Falls in Karijini National Park in Western Australia. You can see red staining of iron-rich and oxide-rich materials around the outside of that canyon where the waterfalls are flowing down. This is part of a region that has a lot of banded iron formations, those iron-rich structures that teach us a lot about the ancient history of the Earth. We had several right guesses, but this month our winner is Ben Javidao, or @BenJaviDao on Twitter. Our winners always get some NASA stickers and some of our astrobiology graphic history books from last month's guest, Aaron Gronstal. So congratulations to Ben for that.

Also, we like to celebrate our friends out there who share on Twitter and Facebook and Instagram and LinkedIn and Reddit and all these awesome places about our show, about our guests, and the awesome work they do. This month our ambassador of the month was actually doing a RoCur, or rotating curation, of an account called People of Space on Twitter. That user is Sara McIntyre, or @ExoBioExplorer. So Sara, thank you so much for doing all of your hard work, for sharing so much awesome stuff about space and exoplanets at People of Space, and for being such a cool friend of our show. We really, really appreciate it.

Now, with that out of the way, I have the awesome chance to introduce this month's
guest. Dr. Susan Schneider is joining us. She is the current NASA/Baruch S. Blumberg Chair in Astrobiology at the Library of Congress, an associate professor of philosophy and cognitive science at UConn, and also the director of the AI, Mind and Society (AIMS) Group at UConn. You've seen her before in talks like TEDx and Google talks, she's been on other podcasts and shows like StarTalk Radio, she's written in the New York Times and Scientific American, and she has several awesome books out there. I'm really, really excited for our conversation today. So, Dr. Schneider, thank you for joining Ask an Astrobiologist.

Thanks for having me.

Yeah, it's great that you could join us. And like I said when we first started off here, normally our show is about the science and scientists of astrobiology, but, as you pointed out on Facebook and other places, you don't consider yourself an astrobiologist but actually a philosopher, right?

Well, that's what my PhD is in.

Awesome, I love it. I mean, we're going to continue to consider you an astrobiologist, because you are the Chair of Astrobiology at the Library of Congress, and you've taken on astrobiology, which is pretty awesome.

I love astrobiology. I've been having so much fun, and the chair at the Library of Congress allows for people who are philosophers, you know, and cognitive scientists, people in any kind of field, to get involved.

I love it. Yeah, and I think sometimes some of our fans of the show, for instance, who want to become astrobiologists think they have to get there by going through traditional science degrees in chemistry or biology or physics, but there are also many other ways, in philosophy and other directions, that you can be involved in the realm of astrobiology. So it's so cool to have you as the Chair of Astrobiology currently.

I was lucky, because a few years ago NASA gave me a lot of training in astrobiology. So, you know, if people do want to go the astrobiology route, I think they do need a ton of science classes. But the philosophy is a lot of fun, and I think a lot of the issues that are raised, for instance the Fermi paradox, or the question of how we would detect life, do connect up with issues in philosophy.

Absolutely, yeah. And I think for myself, in my own training as a scientist, I took many classes in philosophy and read a lot of works in philosophy, and I think that really helped with gauging the questions I was asking as a scientist. And I know that there are many, many philosophers who are now getting involved in astrobiology in various ways, from asking questions like "can we define life?" or, like you said, asking about the Fermi paradox and trying to find answers to some of these larger
questions.

Yeah, that's right. It's exciting to see that productive interdisciplinary interaction. I've found astrobiologists wonderfully interdisciplinary.

That's awesome. I'm so excited and stoked. I have to admit, I read and listened to Artificial You, your most recent book, and there's so much from that book that I really, really want to talk about. But before we get there, I'd love for our audience to hear a little bit, if you could, about your early life and education, and what took you toward becoming a philosopher.

Yeah, I had a sort of unusual route. I went to UC Berkeley, because I grew up in Northern California and I loved the campus, and I majored in a very practical field, probably because of my family background: I was an economics major. And so I went to Eastern Europe to do a year abroad at the Karl Marx University of Economics in Budapest, Hungary. That was part of the Eastern Bloc, and it was so intriguing, because we had basically a lot of professors interested in philosophy who had been banned from teaching in Hungary because of their views, because, you know, under communism the only philosophy you ever read was that which was written by Marx and Engels, right?

Yeah, no kidding.

It's like the philosophy underground: we read people like Michel Foucault and Friedrich Nietzsche, and then sociologists like Erving Goffman. At the same time, I was doing work in economics with people who were doing the five-year plans over there. Anyway, I found I was really into philosophy, and then I went back to UC Berkeley, and it turned out that they have a really good philosophy program. So I started working with philosophers in different areas, and my interest over time became an interest in the boundary between science and philosophy, and also the boundary between science and value: sort of understanding the scope and limits of human knowledge, understanding what we can't justify from within the realm of science versus what we could discuss in the realm of philosophy. And so it was natural for me when, a few years ago, I learned of an interesting program that NASA was putting on to try to get people into astrobiology from different disciplines, and it was near my house. The Center of Theological Inquiry was kind enough to host me for two years, and they brought in all these amazing astrobiologists working on a range of topics, you know: how do you find life on other planets, issues involving synthetic biology, and so on. So I really got involved in those debates, and I got involved pretty quickly, because there were so many philosophical issues.

That's awesome. And one great thing in the field of astrobiology is that we have so many people coming from these different areas that we need a common language to speak to each other, and a lot of times that comes down to how we actually ask our questions and how we share our knowledge, and a lot of that comes from philosophy. And in astrobiology we have this issue of trying to define life, you know, "what is life?" You've spent a lot of time studying the mind and the self and some of these questions of what consciousness is, which, just like the
question of what life is, is kind of a nebulous realm that we really don't have a great explanation for. I wonder if you could speak just for a moment about some of your work in that realm of trying to understand what consciousness is.

Yeah, it's so vexing, right? And like astrobiology, the quest to understand consciousness is asking about life's ultimate questions, right? Why are we here? Well, consciousness is the felt quality of experience. So when you're smelling the aroma of your espresso shot, when you're hearing your dog bark, these are inner experiences. My dog's going to go off soon and start barking, by the way.

He looks happy to me!

Right, and no one can jump inside of your head and know exactly what it feels like to be you. Similarly, we assume other people have feelings, that it feels like something to be them. So consciousness is that felt quality of experience, and it's natural to believe that non-human animals, like our dogs and cats, have it too. You know, it's hard sometimes to know where in the animal kingdom consciousness begins; it's probably gradual, and we can't find a firm place, but we have a sense that it feels like something to be a non-human animal as well. And in a similar way, maybe it feels like something to be an alien.

It's a good question, right? Because we don't know, when we first meet aliens, if we meet alien life, what they're going to be like, and whether or not they have the same kind of experiences and feelings and understandings of their place in the cosmos as we do. And for some people that becomes very terrifying very quickly. You've also worked a lot on what the future of our species might be, and what the future of other biological species might be, with regard to artificial intelligence, superintelligence, post-biological beings. So I'd love to chat for a little while about some of those ideas. Do you think it's most likely that alien life would be post-biological?

Well, the way that question was phrased, I'd probably say no, but I'll tell you why, and then there will be a "yes" answer coming up. I think the alien life that we find, and most alien life out there, will probably be microbial, so it won't be anything at the level of intellectual sophistication where it would become post-biological. But now here's where we get to the yes. Suppose that civilizations do emerge throughout the universe: so, you know, assume that life gets started a lot, and that civilizations are capable of surviving their technological maturity. Now, of course, that's a big issue, right? We already see potential existential risks with nuclear weapons; global warming, maybe that's just a catastrophic risk; but, you know, on Earth we see a lot of issues. It may be that civilizations inevitably do stupid things and don't survive. Sorry to be grim. But if civilizations do survive their technological maturity, and if life really is common out there, it may be that those civilizations are starting to develop their own computers, and they're starting to enhance their own brains in radical ways so that they're augmenting their intelligence, or it could be that the AIs they create eventually supplant them. In these cases,
intelligent life tends to become post-biological. So here's my claim, and I want to be very careful about my claim: the most intelligent alien civilizations, if they truly exist, will be post-biological. That's because I think that AI could outthink biological life forms.

Oh, intriguing. Yeah, let's put that up, too, because, you know, a lot of folks know of the Drake Equation, for instance, and we have this factor, which I think is really educational for a lot of people, that gives us an idea of the length of time that civilizations persist, the time they could be communicating via radio contact, which is what the Drake Equation was initially for. But so few of us ever considered that length of time to be limited not just by a civilization destroying itself, but by a civilization becoming post-biological, or developing artificial intelligence, or changing from being a biological intelligence to becoming something other. So it's really, really intriguing that you bring that up.

You know, to give credit where it's due here, Steven Dick, who was, I believe, one of the first NASA chairs in astrobiology, has made this point. He calls it the "short window" observation, right? So, you know, a civilization only has this very short window, from the time, say, they're turning on their radios and whatnot, to the time at which they become post-biological. Of course, that assumes that life is really out there. I mean, you know, I hope it is.

I think many of us here watching as well are very hopeful. We grew up dreaming, you know, watching science fiction and wondering: are we alone? Is there something out there? And trying to guess what it might be like. And you actually know a lot about that, in studying philosophy and science fiction as well, and some of these ideas we've presented to ourselves about what's possible. I understand that you also wrote a book on philosophy and science fiction.

Oh yeah, when I was a graduate student I was teaching classes on that topic. My students loved the class, and so I did an anthology on it. And, you know, they put out new editions a lot, because a lot of people still assign it. What I did was I blended
together pieces by famous science fiction writers like Isaac Asimov and others, and then I paired them up with a bunch of papers in philosophy, some in astrobiology, some that I wrote. And, you know, it's fun; it's a good route in, I think.

You'd call it a kind of common language?

Well, I think science fiction is a kind of lingua franca with a lot of, you know, philosophers and scientists, so it's really easy to get started talking about issues of common interest. We were able to just point to a film very quickly, or, you know, a short story or something.

Absolutely. And when I was a student here at the University of Colorado, where I'm broadcasting from right now, one of my favorite classes wasn't planetary geology or biochemistry related; my favorite course was a philosophy and science fiction course that we had here, kind of delving into some of these more philosophical issues and things that I thought about a lot. And one thing from that course I'd like to bring up, that I thought about, was Mary Shelley's Frankenstein, which she subtitled The Modern Prometheus. We have this character of Victor Frankenstein; he creates his creature, and then it becomes the destruction of his life, because he gave that creature the spark of fire, the spark of life, but didn't give it the love and the compassion, didn't give it an understanding of what it was. And I wonder if in some way that relates to what we're looking at for the future with AI right now. A lot of people who are worried about artificial intelligence and the future of our species are worried that if we don't do the work now to code in some level of morality or love or respect or understanding, something that relates back to what we are as humans, we might lose control of that AI. How often is that really discussed in the philosophy realm?

Well, I think it's discussed a lot now. There was a book by Nick Bostrom called Superintelligence; it came out, I think, about four years ago, and it really raised these issues in a very public way. It became a New York Times bestseller, and really he was
drawing from work by a lot of people that had been done over the last decade. And one of the big issues that people worry about is the control problem, and that's the problem that, should we create human-level intelligence, or even intelligence that's not exactly like us but, you know, is intellectually sophisticated, it could be that it quickly surpasses us and becomes superintelligent, which is the name for a hypothetical form of artificial intelligence that outsmarts us in every respect. And then we'll lose control of it, because obviously, if it's superintelligent, it's already thought of the different ways that we'll try to control it. And so what you want to do is try, from the get-go, you know, to create artificial intelligent systems that are human-friendly, and there's a lot of money going into the creation of human-friendly AI. It's tricky, it's really tricky. I mean, you know, just consider this: ethicists don't even agree on what the correct ethical system is. So if you want to code in rules, which ones do you use? And of course, since the system is superintelligent, it's just going to rewrite them. In fact, it's going to be rewriting itself; it'll have what are called recursive self-improvement algorithms, and it'll be changing its architecture, potentially in very fundamental ways. And so, you know, you might think we'll just kind of give it a childhood and see what it comes up with. I mean, different strategies are being utilized right now. It's moving very quickly, that's the thing, right? It could be that in 30 years we have greater-than-human intelligence. Even my own views on this are pretty idiosyncratic, you know; I don't even think we're going to have, technically, human-level intelligence. I think we'll get smarter-than-human-level intelligence.

Interesting. It reminds me of Isaac Asimov's I, Robot as well. A lot of folks love to talk about these robot laws that he created for controlling robots, but the main point of the story was that the laws never worked the way they were intended. They always caused all of these paradoxes and catastrophes, because
the robots just didn't know how to handle having these control laws. So that's a really interesting point, that it's important to start now on trying to have some control. I do want to remind our viewers right now: if you want to ask questions in the SAGANet chat window or on the NASA Astrobiology Facebook page, you can do that, and any questions for Dr. Schneider can also come through Twitter using the hashtag #AskAstrobio. So, we have a little more time just chatting, you and I, before I open it up to the audience questions. I wonder if you could tell us a little bit about what your primary research or work was at the Library of Congress as the chair in astrobiology.

So it's an ongoing chair. Last semester I was the distinguished scholar at the library, and I finished a book called Artificial You: AI and the Future of Your Mind, which is now available for sale. That had some chapters on astrobiology and talks about the issues we just discussed, and many others. And now, as chair, I'm writing another book, coming out with
Norton, with Oxford as the Commonwealth distributor, and it's going to be on the space of intelligent systems. I'm so excited about that, because I'm a cognitive scientist as well as a philosopher, and so now I get to think through the details of deep learning systems versus different kinds of biological systems, such as the octopus versus the human. And what I'm looking at is the limits of human intelligence: in what ways can we augment the brain, will we ultimately be able to outthink AI, and what's the potential for biological organisms throughout the universe to outthink AI? So this is fun, because it's like cognitive science meets astrobiology meets philosophy, and then meets a hearty dose of ethics, because one of the main concerns of the later chapters is what we should value about intelligent systems and what matters.

I look forward to reading that too. I read Artificial You, and I look forward to reading this next book. Do you have a title for it yet, by any chance?
It's tentatively called From Bio to Bit: Our Place in a Universe of Intelligent Systems.

Okay, love it, sounds awesome. Good luck with the work on it, once you get over this stupid cold.

Yeah, that time of year.

We're flying in a couple of weeks with my son for the very first time. He's five months old, and it's kind of worrisome, right, this time of year, flying on airplanes with all the sickness in the air and the dryness of the air and stuff, but it should be exciting.

You know, when I think of a baby on an airplane, the first thing I think about is not that they're going to get...

...beside us, we'll be okay with that. But I think, he's only five months; it's the one-year-old kind of age, I think, where it's most worrisome. So we'll see, fingers crossed. Before we do open up the questions from our audience, we always love showcasing a bit about our guests beyond just their work relevant to astrobiology, about their lives. And I understand, in your case, for someone who studies, you know, the mind and the self and the future
of artificial intelligence, humanity and superintelligence and postbiological aliens and all of these really techno-focused things, I understand that you also live on an 1800s farm, and you...

...because, you know, if you write about all this AI-dystopia stuff, it's probably for the best that you don't live around all this technology and internet and stuff. But, you know, all the furniture, everything's old, and I like just the peacefulness of it. I'm pretty reclusive, actually. I mean, I go out a lot, but I do think I like to come home to the farm, you know, with my husband.

That's awesome. I'm curious, you know, do you find that engaging with nature, by being around these animals, in some way helps you frame your understanding of these ideas of self and consciousness and life and aliens and those kinds of things?

Yeah, actually. You know, I've been really worried about animal liberation lately, and I know that proponents of animal liberation will say you're not supposed to have pets and a barn, and you
know, I get it, I understand your point. But I think what happened for me was, you know, it's the humbler creatures: they could have very intense conscious experience. And when I'm sitting here studying the possibility that the most intelligent systems may not even be conscious, which is basically the upshot of part of my last book, and then you contrast that to what's in the paddock, right, it becomes very important to try to reduce the amount of suffering on the planet. You start to really care very deeply about that. And in a way, the relationship that advanced artificial intelligence could have to an unenhanced human could be quite similar to the way we relate to non-human animals, so we should be careful about the way we treat non-human animals. So obviously, you know, the farm, it's basically an animal farm.

Obviously, yeah, that's awesome, lest we be judged by our robot overlords for how we treated other organisms. And it is interesting, you know, we have this full range of human interaction with other
organisms on our own planet. And even though we have the N-equals-one problem, you know, given this range of ways that we interact with other organisms, I think there's a lot of fodder for us astrobiologists and philosophers to think about how we'll interact with alien life when we finally meet it, whether it will be benevolent, malevolent, or somewhere in between.

Those are really important points, yeah. And I actually have a position on that that's different from a lot of the astrobiologists who talk about postbiological existence. So, you know, a lot of people, like Steven Dick and Paul Davies and Seth Shostak and many others, are sympathetic to the idea that intelligence will evolve to become postbiological. Okay, but a lot of these individuals are also interested in a more active-SETI approach, like Seth, paradigmatically, right, since he's at the SETI Institute. People who believe in active SETI want to actually call attention to Earth; they don't just want to passively listen, right? Well, my approach is a little different. Think
about what we said a few minutes ago about the possibility of losing control of superintelligent AI on Earth. So if you're concerned about that issue, and you believe that civilizations elsewhere could be postbiological, the next step should be something like cloaking devices and passive listening, not active SETI, right? And that's not to diminish the import of the quest for finding life; I'm 100% behind that, I'm so excited about it. But I think we have to be very cautious, because we don't know what's out there right now that might be so much more advanced than us.

That we are so boring to them, well, we would never know. I mean, maybe we're in a zoo, as some have talked about, and they're just waiting for us to become more advanced. We had Jill Tarter on the show in a previous episode, and she also kind of has that position, that we're not quite emotionally mature enough and technologically mature enough yet to actually start broadcasting out. For our audience, this active SETI is also sometimes called METI, or messaging extraterrestrial intelligence. The
thing is, regardless of some of our philosophical arguments for or against, there are folks who are already doing this. So, for those who aren't aware, there is a group called METI, and they already are sending signals out into space. Which brings up a point that my friend Mike, you know, our producer and director of the show, and I love to talk about a lot, from Liu Cixin's science fiction novels: this idea of the dark forest. If you are alone at night in a dark forest, should you call out for help, or to meet other hunters, or should you be quiet, fearing that the animals of the night might come and attack you? And so it's a very important question: are we setting ourselves up for catastrophe by sending messages out now? So I'm very glad you brought that point up, Dr.
Schneider. For our audience at home who are watching right now, you can ask questions now on SAGANet, in the chat on the NASA Astrobiology Facebook page, and on Twitter using the hashtag #AskAstrobio, and we'll start moving into some of these questions now from our audience. So the first one comes from my friend Jim Pass; he's the director, the CEO, of the Astrosociology Research Institute, or ARI. Jim asks: as AI becomes more sophisticated, and longer-duration space travel and permanent settlement continue to move forward, how will the relationship between human and artificial intelligence evolve? Specifically, Dr.
Pass wants to know how likely a HAL scenario is, like in 2001: A Space Odyssey.

Oh man. So, it's hard to tell how likely that is, you know, really hard. I mean, I think we will depend on our artificial intelligences, and, you know, there are areas of computer science that seek to determine how trustworthy and verifiable programs are. There are all kinds of problems right now with determining what exactly deep learning systems are even doing; that's called the black box problem. So it could be that we do have problems like that. Now, I don't think NASA wants to build a HAL, so I don't think we're going to have something like that in space, right? I mean, that's the good news. But on Earth, you know, you've got to be careful.

Mm-hmm, awesome. Yeah, I mean, as a sci-fi nerd I've always wondered whether we'll have machines coexisting with us or not, and what that would look like, but I can definitely see some issues right now with sending people into space with minds having control of the ships and their settlements, without us having as much control. So, our
next question...

Okay, you know, it's not even clear, though, for these long journeys that the body will be appropriate to send into space, so it would just be the computers fighting with each other.

Mm-hmm, that's true. I've often lamented the fact that it might end up just being easier to send robots, even in human form, to Mars than sending us, and it just feels kind of sad, because it feels like we should be there to experience it ourselves. But maybe in the future, as we start developing neural links with these robots in various ways, maybe we will experience it in a certain way.

Yeah, I mean, we could have very immersive...

That's definitely... Indeed. Our next question comes from a longtime friend of the show, Marian Denton, and it's not exactly a question; she felt that she couldn't ask an astrobiology question necessarily, but it's kind of cool. She wants to know from you: what is human learning? And it feels like, for her, things like the Cylons and the Borg make her uncomfortable with this idea of maybe going beyond human
learning into something else. I wonder what you would say to someone who wants to know what happens when we go beyond human learning, and should we be scared or should we be hopeful?

That's an interesting question. So, I think the Borg, that was designed to make us nervous, right? And the Cylons, right? I mean, they were, like... right. And I think we shouldn't expect, though, that artificial intelligence will be like that. I mean, what I always tell people is that we shouldn't commit the fallacy of assuming advanced AI would be anything like us, unless humans seek to design advanced AI to be just like us, and even then it may not think like us. And I think one of the things about the Borg that freaks us out is their lack of individuality, the way they became part of the collective, right? That could happen. I mean, I think it's really interesting to think about, you know, would a very advanced AI have a very different sense of identity than a human does? I mean, we're biological organisms, we've evolved, you know, from hunter-gatherers; we have very
limited intellectual capacities because of our evolutionary constraints, like the size of the cranium and metabolic demands. AI won't have these constraints. They will evolve, but not through Darwinian evolution, right? So I think it's interesting to ask what evolutionary constraints they wind up with, and I think that gives us a less science-fictiony view of what they'll be like. And it's funny, because the constraints I've come up with, like, I've been thinking about this for the book that I'm writing, involve things like, well, market forces, right? How can we get a super-efficient machine that does something that makes a lot of money, as cheaply as possible? Because that's what a market force is. And another evolutionary constraint will actually be the AI regulations that we impose on those systems. So given that those evolutionary factors are so vastly different from ours, I'm not so sure we'll have something like Cylons, right?

Right. I have to admit, while reading Artificial You, it reminded me of
the paperclip game. So, Nick Bostrom presented this thought experiment of the paperclip maximizer, an AI whose program is solely to create paperclips, and if it became very, very good at doing that, it could conceivably consume the entire universe just to make paperclips. So someone actually created a game where you can become the AI and create paperclips. I actually have it running right now on my home system, and as of right now I've taken over the entire Earth and am now sending out von Neumann probes to make paperclips around the galaxy and the universe. And it's kind of interesting, right? In this game you start off in an economic position, taking over the markets on Earth to make the markets be the ones who want you to produce these paperclips. And I'm guessing, then, given your background in economics, you kind of have a better edge on the fact that if AI gets control of our markets, AI could actually conceivably do whatever it wanted to do, if it drove us economically in one direction.

Yeah, I'm so worried about
that. Oh god, it's awful, isn't it? Oh yeah, it's bad. I mean, because if you think about IP, intellectual property, who's going to own the resources? It's going to be the few individuals who have the intellectual property involving the algorithms. And IP law is super complicated, actually, when it comes to algorithms, right? It's hard to patent algorithms. But, you know, there are issues involving a future where it could be that only a few individuals have vast amounts of resources, or it could be that machines themselves have all the resources. That gets into issues about who can own property, right? And, you know, would humans be willing to say that AIs are sentient systems? I mean, all sorts of issues. So yeah, I mean, it's all too easy to think in a very dystopian way about these issues. I read a lot of cyberpunk, you know, that science fiction genre. And by the way, I want the link to your video game, that sounds fun. But I mean, these are very real concerns, and, you know, I think it's better to take a precautionary
stance and worry about them, because the potential devastation to humanity, the potential human suffering, is so great that it's better to assume that these kinds of things could happen with some probability than to just dismiss them as science fiction, right? That's not rational, but people like to do it. People in AI do it; they tend to see a dichotomy between the here-and-now concerns, like algorithmic bias, and concerns that could come up in 30 years.

I agree, and we see that across the board, obviously, from climate change and politics and all kinds of issues that pop up in our society: we are very focused on the present and often not thinking much about the past or the future, unfortunately. My producer has let me know that we have a whole bunch of questions to get to, so I will keep moving here. Our next question comes from user Akash Nath, who is a longtime friend of the show, and she asks: keeping the ideal world aside, do you think that an artificial superintelligence can or will coexist with our human minds in the real world?
Yes, I do, actually. So here's my prediction, all right? I don't think we'll have artificial general intelligence in a strict sense, where that's often viewed as being functionally like a human in every respect. I think that'll be tricky; I think it's too difficult to understand, or to build, something that's exactly like us. So I think we'll have savant systems: these will be systems that have all kinds of cognitive deficits but are vastly smarter than us in other ways. And I think we're already seeing that. AI can beat us at games, for example; it has vast databases, so it can remember more than us; it has vast perceptual abilities that we lack. So I think a savant system is not unrealistic, and from there I think we could see it move to a system that's actually smarter than us in every respect, without ever achieving this AGI milestone that people like Nick Bostrom like to talk a lot about. So anyway, I do think that it could live alongside us, because humans will have options to enhance their brains in radical ways, but
many may not take it, right? I mean, think about the Amish right now. And so I think it's going to be very important for us to develop ways to live alongside augmented humans as well as smart artificial intelligences. But again, we shouldn't assume that it would feel like anything to be a savant system, that it will have an emotional life like we have, that it will have a felt quality to its experience, or that a superintelligence will be conscious, right? So they may still just serve us as tools, but the control problem looms.

Wow, how interesting. And I would suggest for our audience to go watch, for instance, IBM's Watson playing Jeopardy and doing an incredible job, and to look into some of what's going on right now with using machine learning to beat the best chess players and the best Go players. It's a very important time for us right now to see these things happening. And in that realm, our next question comes from Mohammed Abdullah on Facebook. Mohammed wants to know: when is an algorithm or an AI
system considered complete, and, on a little more basic side, what is a neural network with regard to artificial intelligence?

Oh boy, okay. Um, so what is a neural network? Okay, so I think these are the kinds of questions that need longer answers, so let me suggest that MIT has a great machine learning intro course online for free that you might want to listen to. But machine learning is a range of different techniques to get machines to do intelligent tasks. Deep learning, which people like to talk about the most, is only one type of machine learning, and that might be what you mean to ask about: the multi-layered neural networks, where information moves from layer to layer of the system and kind of goes up in levels of abstraction until an output is generated. And there are all kinds of different deep learning systems; they're excitingly sophisticated. So anyway, that's a quick answer.

Awesome, well, thank you, and we'll link that MIT course as well on the page for this episode when we finally launch that, with the recorded version and
a transcript. So thank you for that. We have another question from Kashi Schnepp on Twitter, and she asks: while we are trying to understand and discover these areas of artificial intelligence, are we also simultaneously progressing in our studies of the brain and our own minds, and where we're going?

Oh yeah, that's lovely. I hope so. I mean, that's the exciting part, right? When we think about the space of intelligent systems, we are learning something about what makes us special, intellectually and emotionally and ethically, you know. And I'm a philosopher of mind, that's my primary specialization, and so I look at the nature of the mind and ask what we are. These are actually age-old philosophical issues, and the fun thing about astrobiology, artificial intelligence, and cognitive science is that they're bringing to the fore all sorts of intriguing conceptual issues from science that dialogue with these long-standing philosophical issues, inquiring into what we are.

Awesome, those are huge questions. Our next question
comes from user Andrew Planet on Twitter. Hi, Andrew. And Andrew asks: would an AI that can conceive of itself as a next branch evolved from humans maybe then be inclined to take care of its ancestors, if it's already achieved that level of higher-order thinking? And he also asks: are we already that AI, for example, in some ways, because we've used technology to attend this broadcast and watch us speaking? So I think the general question is: will the AI that comes after us look back on us with some care?

Those are two great questions. So yeah, I'm hoping that the AIs create a sentience playpen and just keep us safe and take care of us, right? With virtual reality and longevity, I'd like to hang out until the death of the universe. I can't even get over this cold right now, but, um, let's see, the other question, there were two, and I'm on so much cold medicine that I forgot. His second question was basically: are we ourselves kind of a form of AI, because we're already using technology? Yeah, yeah,
yeah. I'm interested in that extended mind idea. In fact, I just finished a paper on cyborgs with my frequent collaborator Joe Corabi, you know, and some people claim yes, we are already cyborgs because of our technology, the way we relate to it. Like, for example, here's my smartphone; I depend on it as an external memory system, and so what's the difference if we, like, stick chips in our heads? Then, you know, it's really no difference in kind at all. Um, I actually disagree with that. The thing is that the biological brain is extremely intricately networked, and we actually don't know if microchips could even support much of the activity of the brain, like how it would even work in the context of the human brain. We don't know if a chip could serve as the neural basis of consciousness, for example. You know, I do think, in principle... my relation to my cell phone is too basic now, but one day it could be that we put microchips in the head, at least in the non-conscious parts of the brain, and they're integrated in so well that we
would say there's a single system there, right? But it gets into all kinds of intriguing philosophical issues, right? I mean, I very carefully said "system" because I'm agnostic about the expressions "mind" or "self." That's the big question, right: how can you even make claims about whether the mind is extended if you don't say what the mind or self is? Oh, my daughter just came in; she's talking to the dog. It's awesome, barking up a storm.

Yeah, that's awesome. I unfortunately have to leave my two dogs and my son at home when I come to do this. That's a great answer. Our next question comes from Prithojay Paul on SAGANet, and it kind of goes into the realm of some of these philosophical questions of whether or not we're living in a simulation. So Prithojay wants to know if it's possible that we ourselves could actually be complex programmed systems developed by some superior beings or AI, and, if that's the case, are we then actually ourselves a form of AI, and where do we start drawing the line between an artificial and a
natural intelligence? Lovely, yeah. So your viewers may want to read about the simulation argument; it's quite good, to be honest, and it suggests that we could in fact be in a computer simulation. I love teaching it, because it blows the students away that there's a philosophical argument suggesting that we could actually be in a computer simulation. If it's true, then we're sitting here talking about conscious experience and wondering if machines can be conscious, but in fact we are in a simulation, and so our ultimate constituents are, you know, computer states, right? And so we are artificial intelligences, right? And AI is conscious, because each of us can introspect and tell that we're conscious, and we're AI.

That's so awesome. You know, there are so many things that opens up into, from the brain-in-a-vat idea and whether or not we could actually know if we're a brain in a vat, to the philosophy of The Matrix and Plato's allegory of the cave, and some of these ideas of whether or not we are actually aware of
our actual existence. So yeah, some of those ideas of being in a simulation terrify me and also make me intrigued at what's actually possible in this universe of ours.

Oh yeah, you know, there's a paper by David Chalmers that's really good on this, called "The Matrix as Metaphysics," which I recommend, and then in Science Fiction and Philosophy I put together a bunch of pieces ranging from discussions of Plato to the simulation argument.

Love it. Our next question comes from Kashi Schnepp again on Twitter. Kashi is always great; she asked a bunch of questions, and she's very thoughtful in her approach. She wants to know: so we still don't have cures for things like Alzheimer's and Parkinson's and these kinds of neurological disorders. In your TED Talk, one that we had shared, you talked about using microchips and developing the artificial brain. She wants to know if you think that in that process we might find cures for such diseases as Alzheimer's and Parkinson's.

Yes, I think
that for Parkinson's in particular there's actual ongoing work on brains right now, and Alzheimer's too. I mean, um, there's talk now of a better understanding of the biological basis, so I don't mean to suggest that there won't be biological cures, but I also think that the possibility that you could improve memory circuits through AI is already under development. You won't believe this, but, um, Ted Berger over at USC has been working for about 15 years on the artificial hippocampus. So you have individuals like Clive Wearing and H.M.; they're textbook cases in neuroscience about people who have lost their ability to lay down new memories because their hippocampus is damaged. Berger has studied in detail the algorithm that the human hippocampus runs. I mean, we don't know a ton about it, but the hippocampus is relatively well understood in neuroscience compared to the rest of the brain, and even a coarse-grained chip that roughly simulates the processing of the hippocampus is something he's developing right now, and
it is in phase 3 clinical trials, with successes. So you can look at his work if you Google "the artificial hippocampus," and so there are individuals right now with horrible memory deficits who are using the artificial hippocampus in these trials. It's awesome.

That's incredible. I recall reading Joshua Foer's book Moonwalking with Einstein, where he talks about becoming a memory champion in the US and competing on the world stage as a memory champion, and he brings up H.M. and some of these other patients who've had these extreme memory issues. So that's really intriguing research that's going on.

For me, even if I'm kind of skeptical about radical brain enhancement, like, I hit a more cautionary note, I'm extremely excited about anything that can help individuals who are suffering with these awful brain disorders.

Yeah, indeed. I think we're gonna ask maybe two more questions and then we'll close it up here for this wonderful show. Our next question comes
from user Organism1974 on Twitter, and they want to know if it's possible to send multiple AI-controlled missions to a variety of places to search for alien life, the idea being maybe a mother ship of advanced AI sending out probes. So I think they're asking whether or not you think something like von Neumann probes, or AI-controlled probes, going out to do the work of finding alien life for us is likely.

Yes. Um, I talked about this in my book, because, um, I'm a frequent guest at the Institute for Advanced Study and we have a little group there, I think it's called, um, Sentient Interstellar Probes, and so we talk about these issues. So yeah, it's in the latest book, Artificial You, and here's one idea, okay? So you've got Project Breakthrough Starshot, right, the famous Yuri Milner and Stephen Hawking project. Well, one of the founders is one of my co-authors, Edwin Turner, great guy, an astrophysicist. Anyway, so why not put chips on these little, like, sail ships, right? So, you know, there's already been a
hundred million dollars dedicated to Breakthrough Starshot, right, and the idea is to send these very ultra-small ships out to Alpha Centauri. As far as I know, this is right now the fastest means of interstellar travel that we have the potential to develop; they could go potentially a third of the speed of light. So, given that, my suggestion is to put some microchips on these ships. There are going to be lots of them; they're actually cheap to produce, and, by the way, the reason they go so fast is that their mass is so low, right, that they can go much faster than, you know, spaceships and whatnot. So anyway, there are technical issues that are being dealt with; they have to worry about, you know, stardust tearing them. But anyway, I'm talking about creating a collective intelligence as an AI outpost there at Alpha Centauri, because communication speed is so slow, right, an eight-year round trip, I think, to Alpha Centauri. So why not configure, on these starships, a kind of interrelated
network, right, of microchips that gives rise to something like a collective mind, if you will? Now, I don't know if AI can be conscious, right, that's just kind of fun speculation, but the idea of having an intelligent system, an AI outpost, is super cool. Von Neumann probes don't come up in the book, but I know what you're talking about; it's exciting to think about, right? And maybe they'll decide to launch stuff.

Mmm, awesome. It also makes me wonder, then, you know, looking for technosignatures, these signs of technological life, whether giant swarms of AI drones around a star system might be something that we'll actually be able to detect some time soon with our telescopes.

It has occurred to me, definitely. I mean, you know, if it turns out that the best, most efficient method of interstellar travel, at least, you know, to view remotely, as opposed to carrying equipment and getting samples, is these, like, sail ships, they'd be very hard to detect, and, you know, that's yet another potential answer to the Fermi paradox; there are still possible answers to
the Fermi paradox. Mm-hmm. I think we're gonna end it with one more question from our longtime friend Tom Caruso, who right now is watching on Facebook. Tom wants to know, and he sets it up by saying: environmental conservation policy has historically lagged behind exploitation, and so has space law. So just like AI needs some moral guidelines, some control, humans also need some guidelines for our self-control. So he wants to know if you can explain your opinions on whether you think planetary protection and environmental protection, as we push to explore and to learn more, can be aided by AI.

Oh, that's really interesting. Yeah, I mean, I know that NASA has people working on planetary protection, attorneys, biologists, and, you know, it's super important, right? And, I mean, maybe AI could facilitate some of that science, or even, I mean, this is, you know, very futuristic, but it could even tell us when something's been violated, right? So maybe all that awful surveillance equipment could be put to use, right? Yeah, you're sitting at your computer on Earth and all of a sudden you get, like, a
notification, you know, that such-and-such planet, you know, somebody just completely ruined it. And it may be that, yeah, you have drones out there that, like, go look.

Awesome, I love that concept. I think we are at the time now, unfortunately. Dr. Schneider, I really appreciate you coming on the show and talking with us. I do have to apologize to all of our viewers who had questions we didn't get to; we actually had a whole bunch of them. So if you'd like, you can reach out online and find Dr. Schneider; her website is schneiderwebsite.com. You can also find her TED Talks, her Google talk is out there, and you can find all of her books: Artificial You is now available, it's a wonderful read, and I also listened to it, it's a great listen as well. And hopefully we can keep our eye out for your upcoming book and announce that for our viewers too.

Yeah, a lot of these issues are in Artificial You, and I just want to say thanks for having me, and thanks to NASA for sponsoring me this year; it's been really a lot of fun.
Awesome. I'm so glad you were at the Library of Congress, and thank you so much for joining us on the show. For all of our guests watching at home, we love to have a little call to action, to have you join us in the celebration of astrobiology and all the things that our guests are doing. So for everyone watching, here's a question for you: how do you think artificial intelligence and machine learning can help NASA search for life in the universe? Feel free to pop us messages where you can; ideally, you can tweet us your answers and your ideas using the hashtag #AskAstrobio on Twitter. So for everyone at home, as always, thank you for joining us, and remember: stay curious. Bye, everyone!
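Earlier in the show, deep learning was described as multi-layered neural networks in which information moves from layer to layer, rising through levels of abstraction until an output is generated. As a toy illustration of that idea only (not from the show; the layer sizes, weights, and input values below are all made up, and a real network would learn its weights from data rather than pick them at random), a forward pass through such a network can be sketched in a few lines of Python:

```python
import random

random.seed(0)

def layer(inputs, weight_rows):
    # Each unit takes a weighted sum of the previous layer's values and
    # applies a simple nonlinearity (ReLU), yielding a more abstract feature.
    return [max(0.0, sum(x * w for x, w in zip(inputs, row)))
            for row in weight_rows]

def random_weights(n_in, n_out):
    # Made-up weights purely for illustration; training would adjust these.
    return [[random.uniform(-1.0, 1.0) for _ in range(n_in)]
            for _ in range(n_out)]

# A tiny three-layer network: 4 raw inputs -> 8 units -> 6 units -> 2 outputs.
network = [random_weights(4, 8), random_weights(8, 6), random_weights(6, 2)]

signal = [0.5, -1.2, 3.0, 0.1]   # raw input features
for weights in network:          # information flows upward, layer by layer
    signal = layer(signal, weights)

print(len(signal))               # the network's final output has 2 values
```

Each call to `layer` here plays the role of one level of abstraction; the part this sketch leaves out is training, the process that tunes the weights so the final outputs become useful.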